On functions with zero mean over a finite group
Authors
Abstract
Similar resources
Zero-cycles on varieties over finite fields
For any field $k$, Milnor [Mi] defined a sequence of groups $K^M_0(k), K^M_1(k), K^M_2(k), \ldots$, which later came to be known as Milnor K-groups. These were studied extensively by Bass and Tate [BT], Suslin [Su], Kato [Ka1], [Ka2], and others. In [Som], Somekawa investigates a generalization of this definition proposed by Kato: given semi-abelian varieties $G_1, \ldots, G_s$ over a field $k$, there is a...
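For context (a standard fact, not part of the quoted abstract): Milnor K-theory is the quotient of the tensor algebra of the multiplicative group $k^\times$ by the Steinberg relations,

K^M_*(k) \;=\; T_*(k^\times) \,\big/\, \big\langle\, a \otimes (1-a) \;:\; a \in k^\times,\ a \neq 1 \,\big\rangle,

so that $K^M_0(k) = \mathbb{Z}$, $K^M_1(k) = k^\times$, and the degree-$n$ piece $K^M_n(k)$ is generated by symbols $\{a_1, \ldots, a_n\}$.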
Bent functions on a finite nonabelian group
We introduce the notion of a bent function on a finite nonabelian group, which is a natural generalization of the well-known notion of bentness on a finite abelian group due to Logachev, Salnikov and Yashchenko. Using the theory of linear representations and noncommutative harmonic analysis of finite groups, we obtain several properties of such functions similar to the corresponding properties of...
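For orientation, a sketch of the abelian notion being generalized (normalization conventions vary; this is the common one, not necessarily verbatim from Logachev, Salnikov and Yashchenko): a function $f : G \to \mathbb{T}$ from a finite abelian group into the complex unit circle is bent when all of its Fourier coefficients have modulus one,

\widehat{f}(\chi) \;=\; \frac{1}{\sqrt{|G|}} \sum_{x \in G} f(x)\,\overline{\chi(x)}, \qquad |\widehat{f}(\chi)| = 1 \ \text{ for every character } \chi \text{ of } G.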
On the Maximal Cross Number of Unique Factorization Zero-sum Sequences over a Finite Abelian Group
Let $S = (g_1, \ldots, g_l)$ be a sequence of elements from an additive finite abelian group $G$, and let...
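The snippet is cut off in the source; in the standard zero-sum literature, the quantity named in the title, the cross number of $S$, is defined as

\mathsf{k}(S) \;=\; \sum_{i=1}^{l} \frac{1}{\operatorname{ord}(g_i)},

where $\operatorname{ord}(g_i)$ denotes the order of $g_i$ in $G$.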
Shifting Mean Activation Towards Zero with Bipolar Activation Functions
We propose a simple extension to the ReLU-family of activation functions that allows them to shift the mean activation across a layer towards zero. Combined with proper weight initialization, this alleviates the need for normalization layers. We explore the training of deep vanilla recurrent neural networks (RNNs) with up to 144 layers, and show that bipolar activation functions help learning i...
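A minimal sketch of the construction this abstract describes, assuming the bipolar variant negates both the input and the output of the base activation on every other unit (the function names and the NumPy framing here are illustrative, not taken from the paper):

import numpy as np

def relu(x):
    # Standard ReLU: outputs are non-negative, so the layer mean is pushed above zero.
    return np.maximum(x, 0.0)

def bipolar(f, x):
    # Bipolar version of an activation f: apply f on even-indexed units
    # and -f(-x) on odd-indexed units, so the two halves of the layer
    # produce outputs of opposite sign and the layer mean can stay near zero.
    out = np.empty_like(x)
    out[..., 0::2] = f(x[..., 0::2])
    out[..., 1::2] = -f(-x[..., 1::2])
    return out

x = np.random.randn(1000, 64)
print(relu(x).mean())           # about 0.40 for standard-normal inputs
print(bipolar(relu, x).mean())  # close to 0.0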
Journal
Journal title: Functional Analysis and Its Applications
Year: 1997
ISSN: 0016-2663 (print), 1573-8485 (online)
DOI: 10.1007/bf02466011